Wild West

noun

Definition of WILD WEST

: the western United States in its frontier period, characterized by roughness and lawlessness
Wild West adjective
First Known Use of WILD WEST

1844
